# Large-scale corpus training
| Model | Developer | License | Task | Tags | Downloads | Likes | Description |
|---|---|---|---|---|---|---|---|
| Roberta Large 1160k | AI-Sweden-Models | MIT | Large Language Model | Transformers, Multilingual | 1,159 | 10 | Multilingual RoBERTa large model trained on Nordic corpora, supporting Swedish, Norwegian, Danish, and English. |
| Madlad400 7b Mt | google | Apache-2.0 | Machine Translation | Transformers, Multilingual | 4,450 | 15 | Multilingual machine translation model based on the T5 architecture, supporting 400+ languages and trained on 250 billion tokens. |
| Roberta Base Turkish Uncased | burakaytan | MIT | Large Language Model | Transformers, Other | 57 | 16 | RoBERTa base model pre-trained on Turkish using a 38 GB Turkish corpus. |
| Opus Mt Tc Big Lt En | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 312 | 2 | Neural machine translation model for Lithuanian to English, part of the OPUS-MT project. |
| Opus Mt Tc Big Sh En | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 28.03k | 0 | Neural machine translation model for Serbian-Croatian (sh) to English (en), part of the OPUS-MT project. |
| Opus Mt Tc Big Gmq En | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 552 | 0 | Neural machine translation model for North Germanic languages (gmq) to English (en), part of the OPUS-MT project. |
| Opus Mt Tc Big En Es | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 29.31k | 14 | English-to-Spanish neural machine translation model from the OPUS-MT project, based on the transformer-big architecture. |
| Opus Mt Tc Big En It | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 16.22k | 5 | English-to-Italian neural machine translation model from the OPUS-MT project, using the transformer-big architecture. |
| Opus Mt Tc Big En Gmq | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 372 | 3 | Neural machine translation model for English to North Germanic languages (Danish, Faroese, Icelandic, Norwegian Bokmål, Norwegian Nynorsk, and Swedish), part of the OPUS-MT project. |
| Opus Mt Tc Big En Cat Oci Spa | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 30 | 4 | Neural machine translation model for English to Catalan, Occitan, and Spanish, part of the OPUS-MT project. |
| Opus Mt Tc Big De Zle | Helsinki-NLP | | Machine Translation | Transformers, Multilingual | 63 | 0 | Neural machine translation model for German to East Slavic languages (Belarusian, Russian, Ukrainian), part of the OPUS-MT project. |
| Mbarthez | moussaKam | Apache-2.0 | Large Language Model | Transformers, French | 1,032 | 6 | BARThez is a French sequence-to-sequence pre-trained model based on the BART architecture, well suited to generative tasks such as abstractive summarization. |
| Norbert | ltg | | Large Language Model | Other | 199 | 7 | NorBERT is a BERT model optimized for Norwegian, developed by the Language Technology Group at the University of Oslo as part of the NorLM initiative, which aims to provide high-quality language models for Norwegian. |
| Icebert | mideind | | Large Language Model | Transformers, Other | 1,203 | 3 | Icelandic masked language model based on the RoBERTa-base architecture, trained on 16 GB of Icelandic text. |
| Bert Base Arabertv02 | aubmindlab | | Large Language Model | Arabic | 666.17k | 35 | AraBERT is an Arabic pre-trained language model based on the BERT architecture, optimized for Arabic language understanding tasks. |
| Roberta Tagalog Base | jcblaise | | Large Language Model | Transformers, Other | 710 | 4 | RoBERTa model for Tagalog (Filipino), trained on the TLUnified corpus and supporting Filipino NLP tasks. |
| Bert Base Arabert | aubmindlab | | Large Language Model | Arabic | 74.71k | 29 | AraBERT is an Arabic pre-trained language model based on Google's BERT architecture, designed for Arabic natural language understanding tasks. |
| Mt5 Xxl | google | Apache-2.0 | Large Language Model | Transformers, Multilingual | 7,532 | 68 | mT5 is Google's multilingual text-to-text transformer, covering 101 languages, pre-trained on the mC4 dataset, and suitable for a wide range of NLP tasks. |
| Opus Mt Mul En | Helsinki-NLP | Apache-2.0 | Machine Translation | Transformers, Multilingual | 173.61k | 77 | Transformer-based multilingual-to-English machine translation model supporting over 100 source languages. |
| Opus Mt En Mul | Helsinki-NLP | Apache-2.0 | Machine Translation | Transformers, Multilingual | 3,235 | 21 | Transformer-based English-to-multilingual machine translation model supporting over 100 target languages. |
| Sloberta | EMBEDDIA | | Large Language Model | Transformers, Other | 2,691 | 5 | SloBERTa is a monolingual BERT-like model for Slovenian, based on the CamemBERT architecture. |
| Opus Mt En Ine | Helsinki-NLP | Apache-2.0 | Machine Translation | Transformers, Multilingual | 83 | 0 | Transformer-based multilingual machine translation model for translating from English into Indo-European languages. |
| Mbart Large 50 Finetuned Opus En Pt Translation | Narrativa | | Machine Translation | Transformers, Multilingual | 87 | 12 | mBART-50 large model fine-tuned on the opus100 dataset for English-to-Portuguese translation. |
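Most of the translation entries above are OPUS-MT models that can be loaded through the Hugging Face `transformers` library. The following is a minimal sketch for the English-to-Spanish entry; the repository id `Helsinki-NLP/opus-mt-tc-big-en-es` is inferred from the listing's display name, and the example sentence is purely illustrative.

```python
# Minimal sketch (assumption: the listing's "Opus Mt Tc Big En Es" corresponds to
# the Hub repository "Helsinki-NLP/opus-mt-tc-big-en-es").
from transformers import pipeline

# Build a translation pipeline; the model and tokenizer are downloaded on first use.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-en-es")

# Translate a sample English sentence into Spanish.
result = translator("The conference starts tomorrow morning.")
print(result[0]["translation_text"])
```

The same pattern applies to the other OPUS-MT pairs in the table; only the repository id changes.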
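The encoder models in the list (AraBERT, IceBERT, SloBERTa, and the RoBERTa variants) are masked language models, so a quick sanity check is a fill-mask query. The sketch below assumes the Arabic entry is published as `aubmindlab/bert-base-arabertv02`; both the repository id and the Arabic prompt are assumptions, not taken from the listing.

```python
# Minimal sketch (assumption: "Bert Base Arabertv02" is published as
# "aubmindlab/bert-base-arabertv02"; the Arabic prompt is only an illustration).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="aubmindlab/bert-base-arabertv02")

# Use the tokenizer's own mask token so the prompt matches the model family.
prompt = f"عاصمة فرنسا هي {fill_mask.tokenizer.mask_token}."

# Print the top predicted tokens for the masked position with their scores.
for prediction in fill_mask(prompt):
    print(prediction["token_str"], round(prediction["score"], 3))
```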